Similar resources
Probabilistic automata simulation with single layer weightless neural networks
The computability of weightless neural networks is the major topic of this paper. Previous work has shown that one can simulate a Turing machine with a weightless neural network (WNN) with an infinite tape, and that one can simulate probabilistic automata with a WNN with two queues. In this paper, we show that it is possible to simulate a probabilistic automaton...
SEISMIC DESIGN OF DOUBLE LAYER GRIDS BY NEURAL NETWORKS
The main contribution of the present paper is to train efficient neural networks for the seismic design of double layer grids subject to multiple-earthquake loading. As the seismic analysis and design of such large-scale structures require high computational effort, employing neural network techniques substantially decreases the computational burden. Square-on-square double layer grids with the va...
Classification ability of single hidden layer feedforward neural networks
Multilayer perceptrons with hard-limiting (signum) activation functions can form complex decision regions. It is well known that a three-layer perceptron (two hidden layers) can form arbitrary disjoint decision regions and a two-layer perceptron (one hidden layer) can form single convex decision regions. This paper further proves that single hidden layer feedforward neural networks (SLFN's) wit...
On Binary Classification with Single-Layer Convolutional Neural Networks
Convolutional neural networks are becoming standard tools for solving object recognition and visual tasks. However, much of the design and implementation of these complex models is based on trial and error. In this report, the main focus is on some of the important factors in designing convolutional networks so that they perform better. Specifically, classification with wide single-layer networ...
First-order versus second-order single-layer recurrent neural networks
We examine the representational capabilities of first-order and second-order single-layer recurrent neural networks (SLRNN's) with hard-limiting neurons. We show that a second-order SLRNN is strictly more powerful than a first-order SLRNN. However, if the first-order SLRNN is augmented with output layers of feedforward neurons, it can implement any finite-state recognizer, but only if state-spl...
Journal
Journal title: Modeling, Control and Information Technologies
Year: 2019
ISSN: 2707-1049, 2707-1030
DOI: 10.31713/mcit.2019.70